It seems that over the years, America has judged other cultures by our own standards, forgetting the rich history and tradition they have. We seem to stick our nose into the business of other countries and justify our actions as "doing it for the betterment of mankind" or to "fight the war on terror." Do you think America has the right to play "big brother" to the world, or will our fight for justice be our own demise?

Be specific and write with your own voice here. I do not want a regurgitated piece of garbage.

Due 4/9 by 6:00 am.